Some notes on perceptron learning

Author

  • Marco Budinich
Abstract

We extend the geometrical approach to the Perceptron and show that, given n examples, learning is of maximal difficulty when the number of inputs d is such that n = 5d. We then present a new Perceptron algorithm that takes advantage of the peculiarities of the cost function. In our tests it is more than two times faster than the standard algorithm. More importantly, it does not have fixed parameters, like the usual learning constant η, but adapts them to the cost function. We show that there exists an optimal choice for g, the steepness of the transfer function. We also present a brief systematic study of the parameters η and b of the standard Perceptron algorithm.
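For orientation, the *standard* Perceptron training the abstract compares against can be sketched as a delta-rule update through a sigmoid transfer function. This is a hedged illustration only, not the paper's new adaptive algorithm; the names `eta` (learning constant η) and `g` (steepness of the transfer function) follow the abstract's notation, and the toy data is invented.

```python
import numpy as np

def sigmoid(z, g=1.0):
    """Transfer function with steepness g: larger g gives a sharper threshold."""
    return 1.0 / (1.0 + np.exp(-g * z))

def train_perceptron(X, y, eta=0.1, g=1.0, epochs=100):
    """Delta-rule training with a fixed learning constant eta.

    X: (n, d) array of examples; y: targets in {0, 1}.
    A conventional sketch of the standard algorithm, for illustration.
    """
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x, t in zip(X, y):
            out = sigmoid(w @ x, g)
            # gradient of the squared error through the sigmoid
            w += eta * (t - out) * g * out * (1 - out) * x
    return w

# illustrative linearly separable toy data: class 1 when x1 + x2 > 0
X = np.array([[1.0, 1.0], [2.0, 1.0], [-1.0, -1.0], [-2.0, -0.5]])
y = np.array([1, 1, 0, 0])
w = train_perceptron(X, y, eta=0.5, g=2.0, epochs=200)
preds = (sigmoid(X @ w, 2.0) > 0.5).astype(int)
```

Note the role of the fixed parameters: both η and g are frozen for the whole run, which is exactly the rigidity the paper's adaptive scheme is said to remove.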


Similar articles

Comp236: Computational Learning Theory, the Winnow Algorithm

These notes are slightly edited from scribe notes in previous years. In lecture we only covered sections 1 and 3 of these notes, but we provide the full details for completeness. 1 The Winnow Algorithm. Like the Perceptron algorithm, the Winnow algorithm learns linear threshold hypotheses. The algorithm and its analysis are specialized to inputs in {0, 1}^n, that is, when the features are b...

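The Winnow algorithm mentioned above uses multiplicative rather than additive weight updates on {0, 1}-valued inputs. A minimal sketch, assuming the textbook variant (threshold equal to the dimension d, promotion/demotion factor α = 2); the disjunction target and the data are invented for illustration:

```python
import numpy as np

def winnow(X, y, alpha=2.0):
    """Textbook Winnow for linear threshold functions over {0,1}^d.

    Predicts 1 when w.x >= d (threshold = dimension, a common choice).
    On a mistake, the weights of the active features are multiplied
    (false negative) or divided (false positive) by alpha.
    """
    d = X.shape[1]
    w = np.ones(d)
    mistakes = 0
    for x, t in zip(X, y):
        pred = 1 if w @ x >= d else 0
        if pred != t:
            mistakes += 1
            # (t - pred) is +1 to promote, -1 to demote; x masks active features
            w *= alpha ** ((t - pred) * x)
    return w, mistakes

# target concept: x1 OR x2 over 4 boolean features (illustrative)
X = np.array([[1,0,0,0], [0,1,0,1], [0,0,1,0], [0,0,0,1], [1,1,0,0]])
y = np.array([1, 1, 0, 0, 1])
w, m = winnow(np.vstack([X] * 5), np.hstack([y] * 5))
```

The multiplicative update is what gives Winnow its mistake bound that grows only logarithmically in the number of irrelevant features, in contrast to the Perceptron's additive rule.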

UTA DLNLP at SemEval-2016 Task 12: Deep Learning Based Natural Language Processing System for Clinical Information Identification from Clinical Notes and Pathology Reports

We propose a deep neural network based natural language processing system for extraction of clinical information (such as time information, event spans, and their attributes) from raw clinical notes and pathology reports. Our approach uses the context words and their part-of-speech tags and shape information as features. We utilize a temporal (1D) convolutional neural network to learn the hidden fe...


A TS Fuzzy Model Derived from a Typical Multi-Layer Perceptron

In this paper, we introduce a Takagi-Sugeno (TS) fuzzy model which is derived from a typical Multi-Layer Perceptron Neural Network (MLP NN). First, it is shown that the considered MLP NN can be interpreted as a variety of TS fuzzy model. It is discussed that the Membership Function (MF) utilized in such a TS fuzzy model, despite its flexible structure, has some major restrictions. After modify...


CS229 Lecture Notes: Generative Learning Algorithms

So far, we’ve mainly been talking about learning algorithms that model p(y|x; θ), the conditional distribution of y given x. For instance, logistic regression modeled p(y|x; θ) as hθ(x) = g(θᵀx) where g is the sigmoid function. In these notes, we’ll talk about a different type of learning algorithm. Consider a classification problem in which we want to learn to distinguish between elephants (y =...

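The logistic-regression hypothesis these notes contrast with generative models is simple enough to state directly. A minimal sketch; the parameter vector `theta` and input `x` below are illustrative values, not taken from the notes:

```python
import math

def h_theta(theta, x):
    """Logistic-regression hypothesis h_theta(x) = g(theta^T x),
    where g is the sigmoid, modelling p(y = 1 | x; theta)."""
    z = sum(t_i * x_i for t_i, x_i in zip(theta, x))
    return 1.0 / (1.0 + math.exp(-z))

# z = 0.5*2.0 + (-0.25)*4.0 = 0, so the model is maximally uncertain here
p = h_theta([0.5, -0.25], [2.0, 4.0])
```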

CS 229 Lecture Notes, Andrew Ng

So far, we’ve mainly been talking about learning algorithms that model p(y|x; θ), the conditional distribution of y given x. For instance, logistic regression modeled p(y|x; θ) as hθ(x) = g(θᵀx) where g is the sigmoid function. In these notes, we’ll talk about a different type of learning algorithm. Consider a classification problem in which we want to learn to distinguish between elephants (...




Publication year: 1993